Patent abstract:
The invention relates to a device for acquiring a 2D image and a depth image, comprising: a first sensor (C1) comprising a front face and a rear face, the first sensor being formed in and on a first semiconductor substrate (100) and comprising a plurality of 2D image pixels (P1) and a plurality of transmissive windows (F); and a second sensor (C2) having a front face contiguous to the rear face of the first sensor and a rear face opposite the first sensor, the second sensor being formed in and on a second semiconductor substrate (130) and comprising a plurality of depth pixels (P2) arranged opposite the windows of the first sensor.
Publication number: FR3075462A1
Application number: FR1762469
Filing date: 2017-12-19
Publication date: 2019-06-21
Inventors: Jerome Vaillant; Yvon Cazaux; Alexis Rochas
Applicants: Commissariat à l'Energie Atomique (CEA); Commissariat à l'Energie Atomique et aux Energies Alternatives (CEA)
IPC main class:
Patent description:

DEVICE FOR ACQUIRING A 2D IMAGE AND A DEPTH IMAGE OF A SCENE
Field
The present application relates to the field of image acquisition devices, and, more particularly, image acquisition devices adapted to acquire a 2D image and a depth image of a scene.
Presentation of the prior art
Image acquisition devices capable of acquiring depth information have been proposed. For example, time-of-flight (ToF) detectors emit a light signal towards a scene and then detect the return light signal reflected by objects in the scene. By measuring the time of flight of the light signal, the distance from the acquisition device to the objects in the scene can be estimated. The pixels of such a sensor can, for example, use photodiodes of the SPAD (single-photon avalanche diode) type.
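By way of illustration, the distance relation underlying such time-of-flight detectors can be sketched as follows; this is an illustrative example, and the function and constant names are not taken from the present application:

```python
# Direct time-of-flight: the round-trip delay of a light pulse gives the
# distance to the reflecting object.  Illustrative sketch only.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s):
    """Distance to the reflecting object from the measured round-trip time.

    The light travels to the object and back, hence the division by two.
    """
    return C * round_trip_s / 2.0
```

For example, a round-trip time of 10 ns corresponds to a distance of about 1.5 m.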
In some applications, it would be desirable to be able to capture both a 2D image of a scene and a corresponding depth image of the scene.
While a solution to achieve this objective would be to use two separate image sensors, one capturing the 2D image and the other the depth image, such a solution is not optimal
B16492 - DD18320
because the two sensors have different viewpoints on the scene, which leads to a misalignment between the pixels of the corresponding images. In addition, the use of two sensors would increase the size and the cost of the device.
Another solution would be to integrate the 2D image pixels and the depth pixels in the same detector array. One problem, however, is that depth pixels generally have significantly larger dimensions and/or significantly higher supply voltages than 2D image pixels, which makes such integration complex.
It would be desirable to be able to have a device for acquiring a 2D image and a depth image of a scene, this device at least partially overcoming one or more of the drawbacks of known devices.
Summary
Thus, one embodiment provides a device for acquiring a 2D image and a depth image, comprising:
a first sensor having a front face and a rear face, the first sensor being formed in and on a first semiconductor substrate and comprising a plurality of 2D image pixels and a plurality of transmissive windows; and a second sensor comprising a front face contiguous to the rear face of the first sensor and a rear face opposite the first sensor, the second sensor being formed in and on a second semiconductor substrate and comprising a plurality of depth pixels arranged opposite the windows of the first sensor.
According to one embodiment, each depth pixel comprises a SPAD-type photodiode.
According to one embodiment, each depth pixel comprises several memory areas coupled to a same detection area, and makes it possible to measure a phase difference between an amplitude-modulated light signal, emitted by a light source of the device, and the light signal received by the photodetection zone of the pixel after reflection off the scene of which an image is to be acquired.
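The phase-difference measurement described in this embodiment corresponds to so-called indirect time-of-flight operation. As an illustrative sketch only (the four-sample quadrature scheme and all names below are assumptions, not details of the present application), the distance can be recovered from four charge samples integrated at quadrature phases of the modulation:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def itof_distance(c0, c1, c2, c3, f_mod_hz):
    """Distance from four charge samples integrated at 0, 90, 180 and 270
    degrees of the modulation period (e.g. one sample per pixel memory area)."""
    # Phase shift between the emitted and received modulated signals.
    phase = math.atan2(c3 - c1, c0 - c2) % (2.0 * math.pi)
    # A full 2*pi of phase corresponds to one modulation period of round trip,
    # i.e. half a modulation wavelength of distance.
    return C * phase / (4.0 * math.pi * f_mod_hz)
```

With a 20 MHz modulation, a 90° phase shift corresponds to a distance of about 1.87 m; the unambiguous range is half the modulation wavelength.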
According to one embodiment, the second sensor comprises an interconnection stack in which electrical connection tracks and / or terminals are formed connecting the depth pixels of the second sensor to a peripheral circuit for controlling and supplying the second sensor, said interconnection stack being disposed on the side of the rear face of the second semiconductor substrate.
According to one embodiment, the first sensor comprises an interconnection stack in which electrical connection tracks and / or terminals are formed connecting the 2D image pixels of the first sensor to a peripheral circuit for controlling and supplying the first sensor, said interconnection stack being arranged on the side of the rear face of the first semiconductor substrate.
According to one embodiment, the interconnection stack of the first sensor does not include metallizations in the transmissive windows of the first sensor.
According to one embodiment, each 2D image pixel of the first sensor comprises a photodiode.
According to one embodiment, the first semiconductor substrate does not include implanted regions located in the transmissive windows of the first sensor.
According to one embodiment, in each depth pixel of the second sensor, a photodetection zone of the pixel is surrounded by a vertical insulating wall extending through the entire thickness of the second substrate.
According to one embodiment, in top view, the surface area of each depth pixel surrounded by the vertical insulating wall is greater than the surface area of the corresponding transmissive window of the first sensor.
According to one embodiment, each transmissive window of the first sensor comprises a bandpass optical filter adapted to transmit light only in a band whose full width at half maximum is less than 30 nm.
According to one embodiment, each transmissive window of the first sensor comprises a microlens disposed between the rear face of the first semiconductor substrate and the second sensor.
According to one embodiment, the device further comprises a control and processing circuit formed in and on a third semiconductor substrate, said control and processing circuit being attached to the rear face of the second sensor.
According to one embodiment, the control and processing circuit is fixed to the rear face of the second sensor by hybrid molecular bonding.
According to one embodiment, the first and second semiconductor substrates are made of monocrystalline silicon.
According to one embodiment, the first and second sensors are fixed by molecular bonding.
According to one embodiment, the depth pixels are coupled in blocks of several neighboring pixels so as to form a photomultiplier.
According to one embodiment, each transmissive window of the first sensor comprises an active pixel, for example an infrared pixel or a visible pixel.
Brief description of the drawings
These characteristics and advantages, as well as others, will be explained in detail in the following description of particular embodiments made without implied limitation in relation to the attached figures, among which:
Figure 1 is a sectional view schematically and partially illustrating an example of an embodiment of a device for acquiring a 2D image and a depth image;
Figure 2 is a sectional view illustrating in more detail an embodiment of a 2D image sensor of the device of Figure 1;
Figure 3 is a sectional view illustrating in more detail an embodiment of a depth image sensor of the device of Figure 1; and Figure 4 is a schematic top view showing an example of the arrangement of the 2D pixels and the depth pixels in the device of Figure 1.
Detailed description
The same elements have been designated by the same references in the different figures and, moreover, the various figures are not drawn to scale. For the sake of clarity, only the elements useful for understanding the described embodiments have been shown and are detailed. In particular, the production of the photodiodes and control circuits of the 2D image pixels and depth pixels has not been detailed, the production of such pixels being within the reach of those skilled in the art based on the indications of this description. In the following description, when reference is made to qualifiers of absolute position, such as the terms up, down, left, right, etc., or relative position, such as the terms above, below, upper, lower, etc., or to orientation qualifiers, such as the terms horizontal, vertical, etc., reference is made to the orientation of the figures, it being understood that, in practice, the devices described can be oriented differently. Unless otherwise specified, the expressions approximately, substantially, and of the order of mean to within 10%, preferably within 5%.
Figure 1 is a sectional view schematically and partially illustrating an example of an embodiment of a device for acquiring a 2D image and a depth image of a scene.
The device of FIG. 1 comprises:
a first sensor C1 formed in and on a first semiconductor substrate 100, for example a monocrystalline silicon substrate, the sensor C1 comprising a plurality of 2D image pixels P1 and a plurality of windows F distributed over the surface of the sensor; and a second sensor C2 formed in and on a second semiconductor substrate 130, for example a monocrystalline silicon substrate, the sensor C2 being attached to the rear face of the sensor C1 and comprising a plurality of depth pixels P2 arranged respectively opposite the windows F of the sensor C1, each depth pixel P2 comprising a SPAD-type photodiode.
It will be noted that, in the present description, the front face and the rear face of an element respectively designate the face of the element intended to be turned towards the scene of which an image is to be acquired, and the face of the element opposite to its front face. In the example of Figure 1, the front and rear faces of the acquisition device are respectively its upper face and its lower face.
In practice, the device of Figure 1 is intended to be used in combination with a light source, for example a laser source, emitting light at a determined wavelength or in a determined wavelength range, preferably a narrow wavelength range, for example a range with a full width at half maximum of less than 3 nm, for example a source with a central emission wavelength of the order of 940 nm, for example a source of the type presented at https://www.sony-semicon.co.jp/products_en/laserdiode_wld/products/940tof_vcsel.html from the Sony company. By way of example, the range of emission wavelengths of the light source lies outside the visible range, for example in the near infrared, for example in the range from 700 to 1000 nm. In operation, the light signal produced by the light source is emitted towards the scene (for example via one or more
lenses), in the form of light pulses, for example periodic pulses. The return light signal reflected by the scene is picked up by the depth pixels P2 of the sensor C2, so as to measure the time of flight of the light signal at different points of the scene and deduce therefrom the distance from the acquisition device to these points. The P1 pixels of the sensor C1 capture the visible light emitted by the scene to form a 2D image of the scene. The windows F of the sensor C1 are transmissive in the emission range of the light source so as to allow detection of the return light signal by the depth pixels P2 of the sensor C2. By way of example, the transmission coefficient of the windows F of the sensor C1 in the range of emission wavelengths of the light source is greater than 50%.
In the example shown, each pixel P1 of the sensor C1 comprises a photodiode 101 comprising one or more localized implanted regions formed in the semiconductor substrate 100. In this example, the implanted region or regions of the photodiode 101 are arranged on the side of the rear face of the substrate 100. Each pixel P1 may further comprise one or more additional components (not shown), for example control transistors, formed on the side of the rear face of the substrate 100, for example in the substrate 100 and on the rear face of the substrate 100. The sensor C1 further comprises an interconnection stack 110, consisting of alternating dielectric and conductive layers coating the rear face of the substrate 100, in which electrical connection tracks and/or terminals 111 are formed, connecting the pixels P1 of the sensor to a peripheral control and supply circuit, not shown.
In the example shown, the sensor C1 comprises vertical insulation walls 103 passing through the substrate 100 over its entire thickness and delimiting the substrate portions corresponding respectively to the different windows F of the sensor C1. The vertical insulation walls 103 notably have an optical insulation function, and may further have an electrical insulation function. By way of example, the vertical insulation walls 103 are made of a dielectric material, for example silicon oxide. Similar insulation walls may also be provided between the pixels P1 of the sensor C1.
In the example shown, the substrate 100 of the sensor C1 does not include any implanted region located in the windows F of the sensor C1, so as to maximize the transparency of the substrate in the windows F. As a variant, in the windows F, the substrate 100 may be replaced by a material that is more transparent at the emission wavelength of the light source used for the distance measurement, for example silicon oxide.
Furthermore, in this example, the interconnection stack 110 of the sensor C1 does not include a metallic element but only dielectric layers in the windows F, so as to maximize the transparency of the interconnection stack in the windows F. The embodiments described are however not limited to this particular case. Alternatively, the interconnection stack 110 may include conductive elements extending over part of the surface of each window F.
The thickness of the substrate 100 is for example between 2 and 10 μm, for example between 3 and 5 μm.
Each window F has, for example, in top view, dimensions substantially identical to those of the pixels P1 of the sensor C1. By way of example, in top view, the largest dimension of each pixel P1 or window F of the sensor C1 is less than 10 μm, for example less than 5 μm, for example less than 2 μm, for example of the order of 1 μm.
In the example shown, the front face of the substrate 100 is coated with a passivation layer 115, for example a silicon oxide layer, an HfO2 layer, an Al2O3 layer, or a stack of several layers of different materials that may have functions other than passivation alone (anti-reflection, filtering, bonding, etc.), extending over substantially the entire surface of the sensor. By way of example, the layer 115 is placed on and in contact with the front face of the substrate 100.
In the example of Figure 1, the sensor C1 is a 2D color image sensor, that is to say it comprises pixels P1 of different types, adapted to measure light intensities in distinct ranges of visible wavelengths. For this, each pixel P1 includes a color filter 118, for example a layer of colored resin, disposed on the side of the front face of the substrate 100, for example on and in contact with the front face of the passivation layer 115, facing the photodiode 101 of the pixel. By way of example, the sensor C1 comprises three types of pixels P1: first pixels P1, called blue pixels, comprising a color filter 118 preferentially transmitting blue light; second pixels P1, called red pixels, comprising a color filter 118 preferentially transmitting red light; and third pixels P1, called green pixels, comprising a color filter 118 preferentially transmitting green light. In Figure 1, the different types of pixels are not differentiated. Alternatively, the sensor C1 can be a monochromatic 2D image sensor, in which case the filters 118 can be omitted.
In the example shown, each window F of the sensor C1 comprises a filter 120, for example an interference filter, adapted to transmit light in the range of emission wavelengths of the light source. Preferably, the filter 120 is adapted to transmit light only in a relatively narrow wavelength band centered on the range of emission wavelengths of the light source of the system, for example a band whose full width at half maximum is less than 30 nm, for example less than 20 nm, for example less than 10 nm. In this example, the filter 120 is placed on the side of the front face of the substrate 100, for example on and in contact with the front face of the passivation layer 115, and extends over substantially the entire surface of the window F. The filter 120 makes it possible to avoid unwanted triggering of the photodiode of the underlying pixel P2 under the effect of light radiation not coming from the light source of the system. In the example of Figure 1, the filter 120 is located at the level of the windows F of the sensor only, and is arranged on the upper face of the substrate 100, which makes it possible to limit crosstalk between the pixels P1 of the sensor C1. As a variant, the filter 120 may be placed between the substrates 100 and 130, on the lower face of the substrate 100 or on the upper face of the substrate 130, and extend continuously over substantially the entire surface of the device. In this case, to limit the crosstalk linked to the light reflected (neither transmitted nor absorbed) by the filter 120, a metal screen formed in the interconnection stack 110 can be provided at the periphery of each pixel P1 of the sensor C1.
Each pixel P1 of the sensor C1 can also comprise a microlens 122 disposed on the side of the front face of the substrate 100, for example on and in contact with the color filter 118 of the pixel, adapted to focus the incident light on the photodiode 101 of the pixel.
In addition, each window F of the sensor C1 can comprise an external microlens 122, for example similar or identical to the microlenses 122 of the pixels P1, arranged on the side of the front face of the substrate 100, for example on and in contact with the filter 120 of the window.
Each window F can also comprise an internal microlens 124 formed in the interconnection stack 110 of the sensor and making it possible, in cooperation with the microlens 122 of the window, to focus the incident light on the photodiode of the underlying pixel P2.
In the example shown, the rear face of the sensor C1 is bonded to the front face of the sensor C2 by molecular bonding. For this, the sensor C1 comprises a layer 126, for example made of silicon oxide, coating its rear face. In addition, the sensor C2 comprises a layer 132 of the same nature as the layer 126, for example made of silicon oxide, coating its front face. The rear face of the layer 126 is brought into contact with the front face of the layer 132 so as to achieve the molecular bonding of the sensor C2 to the sensor C1. By way of example, the layer 126, respectively 132, extends continuously over the entire surface of the sensor C1, respectively C2.
Each pixel P2 of the sensor C2 comprises a SPAD-type photodiode 133 formed in the substrate 130, facing the corresponding window F of the sensor C1. The photodiode 133 comprises one or more localized semiconductor regions formed in the semiconductor substrate 130. Each pixel P2 may also comprise one or more additional components (not shown), for example control transistors, formed on the side of the rear face of the substrate 130, for example in the substrate 130 and on the rear face of the substrate 130. The sensor C2 further comprises an interconnection stack 140, consisting of alternating dielectric and conductive layers coating the rear face of the substrate 130, in which electrical connection tracks and/or terminals 141 are formed, connecting the pixels P2 of the sensor to a peripheral control and supply circuit, not shown.
A SPAD-type photodiode is here understood to mean a photodiode formed by a PN junction reverse-biased at a voltage greater than its avalanche threshold. When no electric charge is present in the depletion zone or space-charge zone of the PN junction, the photodiode is in a pseudo-stable, non-conductive state. When a photogenerated electric charge is injected into the depletion zone, if the speed of displacement of this charge in the depletion zone is sufficiently high, that is to say if the electric field in the depletion zone is sufficiently intense, the photodiode is likely to enter avalanche. A single photon is thus capable of generating a measurable electrical signal, and this with a very short response time, which is particularly suitable for the time-of-flight measurements targeted here. Most known SPAD photodiode structures can be used in the sensor C2 of Figure 1, for example structures with a surface planar PN junction, structures with a buried planar PN junction, or structures with a vertical PN junction, for example as described in French patent application No. 16/58513 filed on September 13, 2016 and in the corresponding PCT patent application No. PCT/FR2017/052406 filed on September 11, 2017 (B15154/DD17140). The provision of SPAD photodiodes with vertical PN junctions, for example as described in the French and PCT patent applications mentioned above, advantageously makes it possible to limit the active detection area of the pixels P2. This allows the dimensions, in top view, of the pixels P2, and consequently of the windows F, to be relatively small, for example of the same order as the dimensions of the pixels P1, and thus limits the loss of resolution in the 2D image resulting from the presence of the windows F.
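With SPAD-based depth pixels, the time of flight is typically estimated by accumulating photon arrival timestamps over many emitted pulses and locating the peak of the resulting histogram. The following sketch is illustrative only; the bin width and function names are assumptions, not elements of the present application:

```python
# Build a histogram of TDC timestamps over repeated laser pulses and take
# the centre of the fullest bin as the time-of-flight estimate.
from collections import Counter

def tof_from_timestamps(timestamps_s, bin_s=1e-9):
    """Histogram photon arrival times and return the centre of the peak bin."""
    bins = Counter(int(t / bin_s) for t in timestamps_s)
    peak_bin, _ = max(bins.items(), key=lambda kv: kv[1])
    # Return the centre of the most populated bin as the estimated delay.
    return (peak_bin + 0.5) * bin_s
```

Accumulating over many pulses makes the estimate robust to ambient photons and dark counts, which spread uniformly over the histogram while the signal concentrates in one bin.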
In the example shown, in each pixel P2 of the sensor C2, the photodiode 133 of the pixel is entirely surrounded by a vertical insulating wall 135 crossing the substrate 130 over its entire thickness. The wall 135 has in particular an optical isolation function, and may further have an electrical insulation function. By way of example, the vertical insulation wall 135 is made of a dielectric material, for example silicon oxide. As a variant, the vertical insulation wall 135 is a multilayer wall comprising an inner layer of a dielectric material, for example silicon oxide, one or more intermediate layers comprising at least one metallic layer, and an outer layer made of a dielectric material, for example silicon oxide. The vertical insulating wall 135 is for example situated substantially vertically above the vertical insulating wall 103 surrounding the substrate portion 100 of the corresponding window F of the sensor C1.
The walls 103 and 135 in particular make it possible to limit the risk that light rays received by a pixel P1 close to the window F will trigger the SPAD photodiode of the corresponding pixel P2, which could lead to an erroneous depth measurement. In an alternative embodiment, the detection zone of the pixel P2 (and therefore the peripheral wall 135 of the pixel) has, in top view, a surface area greater than that of the corresponding window F of the sensor C1. This facilitates the alignment of the window F above the pixel P2 during the assembly of the sensors C1 and C2.
It will be noted that a photodiode of the SPAD type is generally associated with auxiliary circuits, in particular a circuit for biasing its PN junction at a voltage greater than its avalanche threshold, a reading circuit adapted to detect the triggering of an avalanche of the photodiode, as well as a quenching circuit whose function is to interrupt the avalanche of the photodiode once it has been triggered. These auxiliary circuits have not been shown in the figures and will not be detailed, the embodiments described being compatible with the auxiliary circuits equipping known SPAD photodiodes. These auxiliary circuits can for example be arranged, at least in part, in and on the rear face of the portions of the substrate 130 situated outside the vertical insulation walls 135 of the pixels.
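As an aside on the quenching circuit mentioned above: once quenched, a SPAD is blind for a dead time before it can detect again, so measured count rates under-estimate the incident photon rate. A classic non-paralyzable dead-time correction can be sketched as follows (illustrative only; the function name and example values are assumptions):

```python
# Non-paralyzable dead-time correction for a quenched SPAD: during each
# detection the diode is blind for dead_time_s, so the measured rate
# saturates below the true incident rate.
def true_count_rate(measured_hz, dead_time_s):
    """Correct a measured SPAD count rate for dead-time losses."""
    return measured_hz / (1.0 - measured_hz * dead_time_s)
```

For example, with a 100 ns dead time, a measured rate of 1 Mcounts/s corresponds to a true rate of about 1.11 Mcounts/s; the correction grows quickly as the measured rate approaches 1/dead_time.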
In the example of Figure 1, the sensor C2 further comprises a metallic screen 137 covering substantially the entire front face of the substrate 130, with the exception of the portions of the substrate 130 located inside the walls 135 (corresponding to the photodetection zones of the pixels P2). Here again, the metallic screen 137 has an optical isolation function, aiming to prevent light rays received by a pixel P1 close to the window F from triggering the SPAD photodiode of the corresponding pixel P2. As a variant, the screen 137 is not continuous but consists of a plurality of disjoint rings respectively surrounding, in top view, the photodetection zones of the different pixels P2 of the sensor. An advantage is that this makes it possible to limit stray reflections of light by the screen 137 towards the pixels P1 of the sensor C1.
In this example, the silicon oxide layer 132 bonding the sensor C2 to the sensor C1 is placed on and in contact with the front face of the metal screen 137 outside the photodetection zones of the pixels P2, and on and in contact with the front face of the substrate 130 in the photodetection zones of the pixels P2.
The thickness of the substrate 130 is for example between 5 and 50 μm, for example between 8 and 20 μm.
It will be noted that the arrangement of the sensors C1 and C2 of the device of Figure 1 is advantageous in that the interconnection stack 140 of the sensor C2 is located on the side of the substrate 130 opposite the sensor C1. Indeed, a difficulty encountered when trying to co-integrate conventional photodiode pixels and SPAD photodiode pixels is that the supply voltage levels required by the two types of pixels are very different, which requires providing relatively bulky electrical insulation elements between neighboring pixels of different types. In the example of Figure 1, the sensors C1 and C2 are naturally electrically isolated at their respective pixel arrays, as well as at their respective control/read circuits. Due to the arrangement of the interconnection stack 140 of the sensor C2 on the side of the substrate 130 opposite the sensor C1, the risks of breakdown and/or parasitic coupling linked to the potential difference between the conductive supply tracks of the pixels of the sensor C1 and those of the pixels of the sensor C2 are avoided. By way of example, in the device of Figure 1, the supply voltage of the pixels P2 of the sensor C2 is at least five times, or even ten times, greater than the supply voltage of the pixels P1 of the sensor C1.
Optionally, the device of Figure 1 can also include an additional control and processing circuit C3 formed in and on a third semiconductor substrate 160, for example a monocrystalline silicon substrate, the circuit C3 being attached to the rear face of the sensor C2. The circuit C3 comprises, for example, for each depth pixel P2 of the sensor, an event-dating circuit, for example a circuit of the TDC (Time to Digital Converter) type, making it possible to measure the time of flight of the reflected light signal picked up by the pixel. In this example, the circuit C3 comprises components formed on the side of the front face of the substrate 160, in and on the front face of the substrate 160. The circuit C3 also comprises an interconnection stack 170, consisting of alternating dielectric and conductive layers coating the front face of the substrate 160, in which electrical connection tracks and/or terminals 171 are formed, connecting the components of the circuit C3 to a peripheral control and supply circuit, not shown.
In this example, the front face of the interconnection stack 170 of the circuit C3 comprises metallic pads of electrical connection directly connected to corresponding metallic pads arranged on the rear face of the interconnection stack 140 of the sensor C2. By way of example, the rear face of the sensor C2 is bonded to the front face of the circuit C3 by hybrid metal-metal / oxide-oxide molecular bonding.
The circuit C3 can also be connected to the sensor C1 by isolated conductive vias (not shown) passing through the sensor C2, located in a peripheral region of the device (at the periphery of the pixel matrices of the sensors C1 and C2).
Figure 2 is a sectional view illustrating in more detail an example of an embodiment of the 2D image sensor C1 of the device of Figure 1.
To make this sensor, one starts with a relatively thick semiconductor substrate 100, for example several hundred micrometers thick.
The implanted regions of the photodiodes 101 and of any components for controlling the pixels P1 of the sensor are formed from a first face of the substrate, namely its upper face in the orientation of Figure 2. The vertical insulation walls 103 delimiting, in top view, the windows F of the sensor are further formed from the upper face of the substrate 100.
The interconnection stack 110 of the sensor C1 is then formed on the upper face of the substrate 100. In this example, the interconnection stack comprises an upper passivation layer 201, for example made of silicon nitride. The microlens 124 of the window F is formed in the upper passivation layer 201. More particularly, to form the microlens 124, a resin structure having substantially the shape of the lens to be formed is produced on the upper face of the layer 201, then a localized etching opposite the resin structure is implemented so as to transfer the pattern of the resin structure into the layer 201. The interconnection stack 110 may also include an upper planarization layer 203 made of a material with a refractive index different from that of the layer 201, for example silicon oxide.
In the example of Figure 2, the upper face of the interconnection stack 110 is coated with a layer or a stack of layers 205, for example disposed on and in contact with the upper face of the planarization layer 203, forming an anti-reflection device at the emission wavelength of the light source used for the distance measurement. The layer 205 can also be provided to filter out the other wavelengths, then fulfilling a function similar to that of the filter 120 of Figure 1. As a variant, the bandpass filtering function at the wavelength of the light source is achieved partly by the filter 120 and partly by the layer 205. This makes it possible to relax the constraints on the filtering performance of the filter 120, and thus to make the filter 120 thinner.
The layer 126, intended to ensure the molecular bonding of the sensor C1 to the sensor C2, is for its part disposed on and in contact with the upper face of the anti-reflection layer 205, it being understood that the layer 126 can be taken into account in the dimensioning of the anti-reflection layer 205.
Figure 3 is a sectional view illustrating in more detail an embodiment of the depth image sensor C2 of the device of Figure 1.
To make this sensor, one starts with a relatively thick semiconductor substrate 130, for example several hundred micrometers thick.
The SPAD-type photodiodes 133 as well as any components for controlling the pixels P2 of the sensor are formed from a first face of the substrate 130, namely its upper face in the orientation of Figure 3. The vertical insulation walls 135 delimiting, in top view, the detection zones of the pixels P2 of the sensor are further formed from the upper face of the substrate 130.
In the example shown, each SPAD photodiode 133 comprises a first vertical semiconductor region 301 of a conductivity type opposite to that of the substrate 130, extending in the substrate 130 from its upper face towards its lower face, the lateral faces of the region 301 being in contact with the substrate 130, and the junction between the lateral faces of the region 301 and the substrate 130 defining an avalanche region of the photodiode. By way of example, the region 301 has the shape of a tube with a substantially vertical central axis.
Each SPAD photodiode 133 may further comprise a second horizontal semiconductor region 303 of the same conductivity type as the region 301, disposed on the side of the upper face of the substrate 130, the upper face of the vertical region 301 being in contact with the lower face of the horizontal region 303. By way of example, the region 303 has the shape of a substantially horizontal cover closing the upper end of the tubular region 301.
The horizontal semiconductor region 303 may have a doping level lower than that of the vertical semiconductor region 301, or a doping level substantially equal to that of the vertical semiconductor region 301.
The advantages of such a SPAD photodiode structure are described in French patent application No. 16/58513 and PCT application No. PCT/FR2017/052406 mentioned above.
After the production of the SPAD photodiodes 133 and the vertical insulation walls 135, the interconnection stack 140 of the sensor C2 is formed on the upper face of the substrate 130.
The production of the device of FIG. 1 can comprise the following steps:
- production of the control and processing circuit C3;
- partial production of the sensor C2 according to the steps described in relation to FIG. 3;
- inversion of the sensor C2 of FIG. 3 and hybrid bonding (metal-metal / oxide-oxide) of the upper face (in the orientation of FIG. 3) of the sensor C2 to the upper face (in the orientation of FIG. 1) of the circuit C3;
- thinning of the substrate 130 of the sensor C2, using the circuit C3 as a handle;
- possible production of a band-pass filter adapted to preferentially transmit the range of emission wavelengths of the light source used to perform the distance measurement;
- possible production of the metal screen 137 on the upper face (in the orientation of FIG. 1) of the thinned substrate 130;
- partial production of the sensor C1 according to the steps described in relation to FIG. 2;
- inversion of the sensor C1 of FIG. 2 and bonding of the upper face (in the orientation of FIG. 2) of the sensor C1 to the upper face (in the orientation of FIG. 1) of the thinned substrate 130, without electrical contact between the sensors C1 and C2;
- thinning of the substrate 100 of the sensor C1, using the circuit C3 and the sensor C2 as a holding handle; and
- production of the upper elements of the sensor C1 (in particular the passivation layer 115, the filters 118 and 120 and the microlenses 122) on the upper face of the thinned substrate 100.
FIG. 4 is a schematic top view of the device of FIG. 1, showing an example of arrangement of the 2D pixels P1 and of the depth pixels P2 in the device of FIG. 1.
In this example, the sensor C1 is a color sensor comprising three distinct types of pixels P1, namely red pixels (R), blue pixels (B) and green pixels (G). The pixels P1 are distributed in a matrix of rows and columns, for example according to a Bayer pattern. Periodically, a pixel P1 of the matrix is replaced by a window F (one pixel out of 4 in the row direction and one pixel out of 4 in the column direction in the example shown) surmounting a pixel P2 of the sensor C2. The vertical insulation wall 135 delimiting the detection zone of each pixel P2 is shown in dotted lines in FIG. 4. In this example, in top view, the dimensions of the detection zones of the pixels P2 of the sensor C2 are greater than the dimensions of the windows F of the sensor C1. This facilitates the alignment of the sensor C1 with respect to the sensor C2 during the production of the device.
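The arrangement described above can be sketched as follows (the position of the windows F inside the Bayer mosaic is an illustrative assumption; only the one-out-of-four periodicity comes from the text):

```python
def pixel_layout(rows: int, cols: int) -> list[list[str]]:
    """Build the top-view layout of the sensor C1: a Bayer mosaic of R, G and
    B 2D pixels P1 in which one pixel out of 4 in each direction is replaced
    by a transmissive window 'F' facing a depth pixel P2 of the sensor C2."""
    bayer = [["G", "R"], ["B", "G"]]  # 2x2 Bayer tile
    grid = [[bayer[r % 2][c % 2] for c in range(cols)] for r in range(rows)]
    for r in range(0, rows, 4):          # one row out of 4
        for c in range(0, cols, 4):      # one column out of 4
            grid[r][c] = "F"             # window over a depth pixel P2
    return grid

# An 8x8 tile contains 4 windows F and 60 color pixels.
```

With this one-out-of-four replacement, the 2D image loses only 1/16 of its pixels, which can be compensated by interpolation from the neighboring color pixels.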
In the device of FIG. 1, the depth pixels P2 can be controlled individually so as to produce a depth image with a resolution equal to the number of pixels P2 of the sensor C2.
As a variant, the pixels P2 can be coupled in blocks of several neighboring pixels, for example blocks of three by three neighboring pixels P2, so as to produce a photomultiplier, for example of the SiPM type. Only the correlated events within each block are then retained; in other words, only the events detected simultaneously by several pixels of the block are used to construct the depth image. The resolution of the depth image is then lower than the number of pixels P2 of the sensor C2, but the noise immunity of the depth image sensor is thereby improved.
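A minimal sketch of this coincidence filtering (the timestamps, window width and threshold below are illustrative assumptions, not values from the text): an event is kept only if several SPADs of the block fire within a narrow time window, which rejects uncorrelated dark counts:

```python
def correlated_events(block_timestamps, window_ns=2.0, min_pixels=3):
    """Keep only event times seen by at least `min_pixels` distinct SPADs of
    a block within `window_ns`, as in a SiPM-style macro-pixel.
    `block_timestamps` is a list of per-pixel lists of arrival times (ns)."""
    # Flatten all events of the block, remembering which pixel fired.
    events = sorted((t, p) for p, ts in enumerate(block_timestamps)
                    for t in ts)
    kept = []
    for i, (t0, _) in enumerate(events):
        # Distinct pixels firing within the coincidence window starting at t0.
        pixels = {p for t, p in events[i:] if t - t0 <= window_ns}
        if len(pixels) >= min_pixels:
            kept.append(t0)
    return kept

# Three pixels firing together near 10 ns pass the filter;
# isolated dark counts at other times are rejected.
```

A real photon return illuminates several SPADs of the block at once, whereas thermally generated dark counts are uncorrelated between pixels, which is why the coincidence requirement improves noise immunity.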
It will be noted that, depending on the application considered, the rate of acquisition of 2D images by the sensor C1 may be different from the rate of acquisition of the depth images by the sensor C2.
Particular embodiments have been described. Various variants and modifications will be apparent to those skilled in the art. In particular, embodiments have been described in which the bonding between the sensors C1 and C2 is a molecular bonding of an oxide layer 126 of the sensor C1 on an oxide layer 132 of the sensor C2. However, the embodiments are not limited to this particular case. As a variant, the layers 126 and 132 may be metallic layers, the bonding produced then being a metal-metal molecular bonding. In this case, the metal layers 126 and 132 may include openings facing the windows F of the sensor C1. The stack of the layers 126 and 132 then forms a metal screen covering substantially the entire front face of the substrate 130 of the sensor C2, with the exception of the detection zones of the pixels P2, and this screen can replace the screen 137 of FIG. 1.
Furthermore, embodiments have been described above in which each depth pixel P2 of the sensor C2 comprises a photodiode of the SPAD type. However, the embodiments described are not limited to this particular case. As a variant, the depth pixel can be produced in any other technology suitable for measuring the time of flight of a light signal emitted by a light source and reflected by the scene. By way of example, the depth pixel can be a lock-in type pixel, as described in French patent applications No. 16/62341 (DD17552 / B15537) and No. 16/62340 (DD17326 / B15344) previously filed by the applicant, that is to say a pixel comprising several memory zones coupled to the same detection zone, making it possible to measure a phase shift between an amplitude-modulated light signal emitted by the light source and the light signal received by the photodetection zone of the pixel after reflection on the scene.
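By way of illustration (the four-tap sampling scheme and the 20 MHz modulation frequency are common indirect time-of-flight assumptions, not details of the cited applications), the phase shift measured by such a lock-in pixel converts to a distance as follows:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lockin_depth_m(a0, a1, a2, a3, f_mod_hz=20e6):
    """Distance from four samples of the correlation between the emitted
    amplitude-modulated signal and the received signal, taken at 0, 90,
    180 and 270 degrees of the modulation period."""
    phase = math.atan2(a3 - a1, a0 - a2)  # phase shift of the return signal
    if phase < 0.0:
        phase += 2.0 * math.pi
    # A full phase turn corresponds to half a modulation wavelength,
    # since the light travels to the scene and back.
    return C * phase / (4.0 * math.pi * f_mod_hz)

# A 90-degree phase shift at 20 MHz corresponds to about 1.87 m.
```

At 20 MHz the unambiguous range is C / (2 * f_mod), about 7.5 m, beyond which the measured phase wraps around.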
In addition, examples of embodiments have been described above in which the transmissive windows F formed in the sensor C1 do not include an active component. However, the embodiments described are not limited to this particular case. As a variant, provision may be made to have, in each window F of the sensor C1, an active pixel, for example an infrared pixel, or a visible pixel.
By way of example, provision may be made to include in each window F an infrared pixel. The infrared pixels provided in the transmissive windows F make it possible, for example, to carry out depth measurements by means of a light source applying structured lighting to the scene. For example, the depth measurement using structured light and the infrared pixels of the sensor C1 can be used to measure relatively short distances, for example less than 5 meters, while the depth measurement by time of flight, using the pixels P2 of the sensor C2, can be used to measure relatively long distances, for example greater than 5 meters.
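For comparison, the direct time-of-flight measurement performed with the SPAD pixels P2 can be sketched as follows (the TCSPC-style histogramming and all numerical values are illustrative assumptions): photon arrival times are histogrammed, the most populated bin locates the return pulse, and the round-trip time converts to a distance:

```python
from collections import Counter

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(arrival_times_ns, bin_ns=0.5):
    """Histogram SPAD photon arrival times, take the most populated bin as
    the position of the return pulse, and convert it to a distance.
    The factor 1/2 accounts for the round trip of the light pulse."""
    bins = Counter(int(t / bin_ns) for t in arrival_times_ns)
    peak_bin = max(bins, key=bins.get)
    t_ns = (peak_bin + 0.5) * bin_ns  # center of the peak bin
    return C * t_ns * 1e-9 / 2.0

# A return pulse clustered around 50 ns, plus two uncorrelated dark
# counts, yields roughly 7.5 m.
```

Unlike the phase-shift scheme, this direct measurement has no wrap-around ambiguity, which is consistent with its use for the longer distances mentioned above.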
Claims (18)
1. Device for acquiring a 2D image and a depth image, comprising:
a first sensor (C1) comprising a front face and a rear face, the first sensor being formed in and on a first semiconductor substrate (100) and comprising a plurality of 2D image pixels (P1) and a plurality of transmissive windows (F); and a second sensor (C2) comprising a front face contiguous to the rear face of the first sensor and a rear face opposite the first sensor, the second sensor being formed in and on a second semiconductor substrate (130) and comprising a plurality of depth pixels (P2) arranged opposite the windows of the first sensor.
2. Device according to claim 1, in which each depth pixel comprises a photodiode of the SPAD type (133).
3. Device according to claim 1, in which each depth pixel comprises several memory zones coupled to the same detection zone and makes it possible to measure a phase shift between an amplitude-modulated light signal, emitted by a light source of the device, and a light signal received by the photodetection zone of the pixel after reflection on a scene of which an image is to be acquired.
4. Device according to any one of claims 1 to 3, in which the second sensor (C2) comprises an interconnection stack (140) in which tracks and/or electrical connection terminals (141) are formed connecting the depth pixels (P2) of the second sensor to a peripheral circuit for controlling and supplying the second sensor, said interconnection stack (140) being arranged on the side of the rear face of the second semiconductor substrate (130).
5. Device according to any one of claims 1 to 4, in which the first sensor (C1) comprises an interconnection stack (110) in which tracks and/or electrical connection terminals (111) are formed connecting the 2D image pixels (P1) of the first sensor to a peripheral circuit for controlling and supplying the first sensor, said interconnection stack (110) being arranged on the side of the rear face of the first semiconductor substrate (100).
6. Device according to claim 5, in which the interconnection stack (110) of the first sensor (C1) does not include metallizations in the transmissive windows (F) of the first sensor (C1).
7. Device according to any one of claims 1 to 6, in which each 2D image pixel (P1) of the first sensor comprises a photodiode (101).
8. Device according to any one of claims 1 to 7, in which the first semiconductor substrate (100) does not include implanted regions located in the transmissive windows (F) of the first sensor (C1).
9. Device according to any one of claims 1 to 8, in which, in each depth pixel (P2) of the second sensor (C2), a photodetection zone of the pixel is surrounded by a vertical insulating wall (135) extending over the entire thickness of the second substrate (130).
10. Device according to claim 9, in which, in top view, the surface of each depth pixel (P2) surrounded by the vertical insulating wall (135) is greater than the surface of the corresponding transmissive window (F) of the first sensor.
11. Device according to any one of claims 1 to 10, in which each transmissive window (F) of the first sensor (C1) comprises an optical band-pass filter (120) adapted to transmit light only in a band having a full width at half maximum of less than 30 nm.
12. Device according to any one of claims 1 to 11, in which each transmissive window (F) of the first sensor (C1) comprises a microlens (124) disposed between the rear face of the first semiconductor substrate (100) and the second sensor (C2).
13. Device according to any one of claims 1 to 12, further comprising a control and processing circuit (C3) formed in and on a third semiconductor substrate (160), said control and processing circuit (C3) being attached to the rear face of the second sensor (C2).
14. Device according to any one of claims 1 to 13, in which the control and processing circuit (C3) is fixed to the rear face of the second sensor (C2) by hybrid molecular bonding.
15. Device according to any one of claims 1 to 14, in which the first (100) and second (130) semiconductor substrates are made of monocrystalline silicon.
16. Device according to any one of claims 1 to 15, in which the first (C1) and second (C2) sensors are fixed by molecular bonding.
17. Device according to any one of claims 1 to 16, in which the depth pixels (P2) are coupled in blocks of several neighboring pixels so as to produce a photo-multiplier.
18. Device according to any one of claims 1 to 17, in which each transmissive window (F) of the first sensor (C1) comprises an active pixel, for example an infrared pixel or a visible pixel.
Similar patents:
Publication No. | Publication date | Title
EP3503192B1|2022-02-09|Device for acquiring a 2d image and a depth image of a scene
EP3098858B1|2017-10-25|Photodetector with high quantum efficiency
EP2636067B1|2017-12-06|Detector of visible and near-infrared radiation
EP2786412B1|2021-04-14|Optical detector unit
FR2966978A1|2012-05-04|VISIBLE AND NEAR INFRARED RADIATION DETECTOR
FR2888989A1|2007-01-26|IMAGE SENSOR
US20210273120A1|2021-09-02|Photodetectors, preparation methods for photodetectors, photodetector arrays, and photodetection terminals
FR3082322A1|2019-12-13|IMAGE SENSORS INCLUDING AN INTERFERENTIAL FILTER MATRIX
US11152411B2|2021-10-19|Resonant cavity enhanced image sensor
EP2840617B1|2017-11-29|BSI photodiode with high quantum efficiency
FR3040536A1|2017-03-03|IMAGE SENSOR WITH REDUCED SPECTRAL AND OPTICAL DIAPHOTIE
WO2007088267A2|2007-08-09|Light emission device with chromatic control
EP2851955A1|2015-03-25|Method for fabricating an optical filter in an integrated circuit, and corresponding integrated circuit
FR3001578A1|2014-08-01|PHOTODIODE MATRIX WITH DOPED ZONE ABSORBING THE LOADS
WO2015004235A1|2015-01-15|Semi-transparent photo-detector having a structured p-n junction
EP1432044A1|2004-06-23|Photodiode with an integrated polysilicon filter
FR3108783A1|2021-10-01|Device for acquiring a 2D image and a depth image of a scene
FR2908880A1|2008-05-23|Light beams' phase difference detecting device e.g. micro-interferometer, has amplitude network for generating field of interference between light beams and comprising succession of opaque zones and transparent zones
WO2014091093A1|2014-06-19|Multispectrum visible and infrared monolithic imager
FR3026227A1|2016-03-25|DEVICE FOR ACQUIRING 3D IMAGES
FR3111421A1|2021-12-17|Depth chart sensor
FR3112425A1|2022-01-14|Image sensors comprising an array of interference filters
FR3100926A1|2021-03-19|Image sensor made in sequential 3D technology
FR3091023A1|2020-06-26|Image sensor
EP0923141A1|1999-06-16|Photodetecting device, process of manufacturing and use for multispectral detection
Patent family:
Publication No. | Publication date
EP3503192A1|2019-06-26|
FR3075462B1|2020-03-27|
EP3503192B1|2022-02-09|
US20190191067A1|2019-06-20|
US11076081B2|2021-07-27|
Cited references:
Publication No. | Filing date | Publication date | Applicant | Title
US20130234029A1|2012-03-06|2013-09-12|Omnivision Technologies, Inc.|Image sensor for two-dimensional and three-dimensional image capture|
US20150054962A1|2013-08-23|2015-02-26|Aptina Imaging Corporation|Imaging systems with stacked image sensors|
FR3026227A1|2014-09-18|2016-03-25|Commissariat Energie Atomique|DEVICE FOR ACQUIRING 3D IMAGES|
US20100157117A1|2008-12-18|2010-06-24|Yu Wang|Vertical stack of image sensors with cutoff color filters|
US8163581B1|2010-10-13|2012-04-24|Monolith IC 3D|Semiconductor and optoelectronic devices|
KR101399338B1|2011-08-08|2014-05-30|실리콘화일|stacking substrate image sensor with dual sensing|
KR20140009774A|2012-07-13|2014-01-23|삼성전자주식회사|3d image sensor and system including the same|
US20140138520A1|2012-11-21|2014-05-22|Taiwan Semiconductor Manufacturing Company, Ltd.|Dual-Side Illumination Image Sensor Chips and Methods for Forming the Same|
US9105550B2|2013-01-11|2015-08-11|Digimarc Corporation|Next generation imaging methods and systems|
CN107078138B|2014-10-06|2020-12-18|索尼公司|Solid-state image pickup device and electronic apparatus|
US9508681B2|2014-12-22|2016-11-29|Google Inc.|Stacked semiconductor chip RGBZ sensor|
FR3056332A1|2016-09-21|2018-03-23|Stmicroelectronics Sas|DEVICE COMPRISING A 2D IMAGE SENSOR AND A DEPTH SENSOR|
US9818791B1|2016-10-04|2017-11-14|Omnivision Technologies, Inc.|Stacked image sensor|
US10985203B2|2018-10-10|2021-04-20|Sensors Unlimited, Inc.|Sensors for simultaneous passive imaging and range finding|
JP2020123930A|2019-01-31|2020-08-13|キヤノン株式会社|Imaging device|
US11251219B2|2020-03-10|2022-02-15|Sensors Unlimited, Inc.|Low capacitance photo detectors|
FR3108783A1|2020-03-24|2021-10-01|Commissariat A L'energie Atomique Et Aux Energies Alternatives|Device for acquiring a 2D image and a depth image of a scene|
Legal events:
2018-12-31| PLFP| Fee payment|Year of fee payment: 2 |
2019-06-21| PLSC| Publication of the preliminary search report|Effective date: 20190621 |
2019-12-31| PLFP| Fee payment|Year of fee payment: 3 |
2020-12-28| PLFP| Fee payment|Year of fee payment: 4 |
2021-12-31| PLFP| Fee payment|Year of fee payment: 5 |
Priority:
Application No. | Filing date | Title
FR1762469|2017-12-19|
FR1762469A|FR3075462B1|2017-12-19|2017-12-19|DEVICE FOR ACQUIRING A 2D IMAGE AND A DEPTH IMAGE OF A SCENE|
EP18208212.3A| EP3503192B1|2017-12-19|2018-11-26|Device for acquiring a 2d image and a depth image of a scene|
US16/208,712| US11076081B2|2017-12-19|2018-12-04|Device for acquiring a 2D image and a depth image of a scene|